Statistical Learning Theory
Instructors: R. Castro and A. Singh
Lecture 17: Minimax Lower Bounds

Lower Performance Bounds
Abstract
* An observation model Pf, indexed by f ∈ F. Pf denotes the distribution of the data under model f. E.g., in regression and classification this is the distribution of Z = (X1, Y1, . . . , Xn, Yn) ∈ Z. We will assume that Pf is a probability measure on the measurable space (Z, B).
* A performance metric d(·, ·) ≥ 0. If you have a model estimate f̂n, then the performance of that estimate relative to the true model f is d(f̂n, f). E.g., in regression one often takes the L2 distance between f̂n and f.
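The two ingredients above can be made concrete with a small simulation. The following is a minimal sketch (an illustrative example, not taken from the lecture): a regression observation model Pf that generates Z = (X1, Y1, . . . , Xn, Yn) with Yi = f(Xi) + noise, a simple piecewise-constant estimator f̂n, and the performance metric d(f̂n, f) computed as a squared L2 distance on a grid. The choice of true function, noise level, and estimator are all assumptions made for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_f(x):
    # The unknown model f in F (assumed for illustration)
    return np.sin(2 * np.pi * x)

# Draw data Z = (X_1, Y_1, ..., X_n, Y_n) from the observation model P_f
n = 200
X = rng.uniform(0, 1, n)
Y = true_f(X) + 0.1 * rng.normal(size=n)

# A simple estimator fhat_n: a piecewise-constant (regressogram) fit
bins = 10
edges = np.linspace(0, 1, bins + 1)
idx = np.clip(np.digitize(X, edges) - 1, 0, bins - 1)
fhat_vals = np.array([Y[idx == b].mean() if np.any(idx == b) else 0.0
                      for b in range(bins)])

def fhat(x):
    b = np.clip(np.digitize(x, edges) - 1, 0, bins - 1)
    return fhat_vals[b]

# Performance metric d(fhat_n, f): squared L2 distance, approximated on a grid
grid = np.linspace(0, 1, 1000)
d = np.mean((fhat(grid) - true_f(grid)) ** 2)
print(f"d(fhat_n, f) = {d:.4f}")
```

Minimax lower bounds ask how small supf∈F E[d(f̂n, f)] can possibly be over all estimators f̂n; a single run like this only measures one estimator's loss against one fixed f.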
Similar Resources
Localized Upper and Lower Bounds for Some Estimation Problems
We derive upper and lower bounds for some statistical estimation problems. The upper bounds are established for the Gibbs algorithm. The lower bounds, applicable for all statistical estimators, match the obtained upper bounds for various problems. Moreover, our framework can be regarded as a natural generalization of the standard minimax framework, in that we allow the performance of the estima...
On Bayes Risk Lower Bounds
This paper provides a general technique for lower bounding the Bayes risk of statistical estimation, applicable to arbitrary loss functions and arbitrary prior distributions. A lower bound on the Bayes risk not only serves as a lower bound on the minimax risk, but also characterizes the fundamental limit of any estimator given the prior knowledge. Our bounds are based on the notion of f -inform...
Tight Lower Bounds for Homology Inference
The homology groups of a manifold are important topological invariants that provide an algebraic summary of the manifold. These groups contain rich topological information, for instance, about the connected components, holes, tunnels and sometimes the dimension of the manifold. In earlier work [1], we have considered the statistical problem of estimating the homology of a manifold from noiseles...
A strong converse bound for multiple hypothesis testing, with applications to high-dimensional estimation
In statistical inference problems, we wish to obtain lower bounds on the minimax risk, that is to bound the performance of any possible estimator. A standard technique to do this involves the use of Fano’s inequality. However, recent work in an information-theoretic setting has shown that an argument based on binary hypothesis testing gives tighter converse results (error lower bounds) than Fan...
Minimax Theory for High-dimensional Gaussian Mixtures with Sparse Mean Separation
While several papers have investigated computationally and statistically efficient methods for learning Gaussian mixtures, precise minimax bounds for their statistical performance as well as fundamental limits in high-dimensional settings are not well-understood. In this paper, we provide precise information theoretic bounds on the clustering accuracy and sample complexity of learning a mixture...